
    On the conditions for the existence of Perfect Learning and power law in learning from stochastic examples by Ising perceptrons

    In a previous letter, we studied learning from stochastic examples by perceptrons with Ising weights in the framework of statistical mechanics. Under the one-step replica symmetry breaking ansatz, the behaviours of the learning curves were classified according to a local property of the rules by which examples were drawn. Further, the conditions for the existence of Perfect Learning, together with other behaviours of the learning curves, were given. In this paper, we give the detailed derivation of these results and a further argument about Perfect Learning, together with extensive numerical calculations. Comment: 28 pages, 43 figures. Submitted to J. Phys.
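    As a concrete toy illustration of the setting (not the paper's replica calculation): below is a minimal sketch in which a perceptron with Ising weights learns a teacher rule whose labels are randomly flipped, i.e. stochastic examples. The system sizes, the flip probability and the exhaustive zero-temperature minimisation are all illustrative assumptions.

```python
# Minimal sketch, assuming a small system and exhaustive search: an Ising
# perceptron learns from stochastically labelled examples of an Ising teacher.
import itertools
import numpy as np

rng = np.random.default_rng(0)
N = 11                                    # number of Ising weights (odd, so fields never vanish)
teacher = rng.choice([-1, 1], size=N)     # target rule, also with Ising weights

def noisy_label(x, flip_prob=0.1):
    """Stochastic example: the teacher's output, flipped with probability flip_prob."""
    y = 1.0 if teacher @ x > 0 else -1.0
    return -y if rng.random() < flip_prob else y

P = 60                                    # training-set size
X = rng.choice([-1, 1], size=(P, N))
Y = np.array([noisy_label(x) for x in X])

# Zero-temperature Gibbs learning: exhaustively pick the Ising student
# with the fewest training errors (feasible only for small N).
best_w, best_err = None, P + 1
for w in itertools.product([-1, 1], repeat=N):
    w = np.array(w)
    err = int(np.sum(np.sign(X @ w) != Y))
    if err < best_err:
        best_w, best_err = w, err

R = best_w @ teacher / N                  # R = 1 corresponds to Perfect Learning
print(f"training errors: {best_err}, teacher overlap R = {R:.2f}")
```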

    Response to Invasion by Antigen and Effects of Threshold in an Immune Network Dynamical System Model with a Small Number of Degrees of Freedom

    We study a dynamical system model of an idiotypic immune network with a small number of degrees of freedom, focusing mainly on the effect of a threshold above which antibodies can recognise other antibodies. The response of the system to invasion by antigens is investigated in both models, with and without the threshold, and it turns out that the system changes in a desirable direction for perturbations of moderate magnitude. The propagation of disturbance by an antigen is also investigated in a system of one-dimensionally connected basic units, taking the closed 3-clone system as the unit, and it is clarified that the threshold enhances the stability of the network and localises the immune response. Comment: 6 pages, 6 figures. Submitted to Prog. Theor. Phys.
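    For orientation, here is a minimal sketch of a closed 3-clone unit with a recognition threshold; the response function, the cyclic coupling and all parameters are hypothetical choices, not the model equations of the paper.

```python
# Hypothetical 3-clone sketch: recognition only occurs above the threshold
# theta; f, the cyclic coupling and all parameters are illustrative.
import numpy as np

def f(h, theta=0.1):
    """Saturating proliferation response, switched off below the threshold."""
    return np.where(h > theta, h / (1.0 + h), 0.0)

def step(x, dt=0.01, s=0.1, d=0.8):
    """Euler step: source s, stimulation by the next clone, decay rate d."""
    h = np.roll(x, -1)                     # clone i is recognised by clone i+1
    return x + dt * (s + x * (f(h) - d))

x = np.array([0.2, 0.2, 0.2])
x[0] += 1.0                                # antigen invasion: perturb clone 0
for _ in range(5000):
    x = step(x)
print("steady state after invasion:", x.round(3))
```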

    Diagonalization of replicated transfer matrices for disordered Ising spin systems

    We present an alternative procedure for solving the eigenvalue problem of replicated transfer matrices describing disordered spin systems with (random) 1D nearest neighbor bonds and/or random fields, possibly in combination with (random) long range bonds. Our method is based on transforming the original eigenvalue problem for a $2^n \times 2^n$ matrix (where $n \to 0$) into an eigenvalue problem for integral operators. We first develop our formalism for the Ising chain with random bonds and fields, where we recover known results. We then apply our methods to models of spins which interact simultaneously via a one-dimensional ring and via more complex long-range connectivity structures, e.g. $1+\infty$ dimensional neural networks and `small world' magnets. Numerical simulations confirm our predictions satisfactorily. Comment: 24 pages, LaTeX, IOP macros
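    As a point of reference for the non-replicated case that the formalism generalizes: the free energy of a 1D Ising chain with random bonds and fields follows from products of ordinary 2x2 transfer matrices, as in this illustrative sketch.

```python
# Illustrative sketch: free energy per spin of a random-bond, random-field
# Ising chain via 2x2 transfer matrices (the n -> 0 replicated version of
# this object is what the paper diagonalizes). Distributions are examples.
import numpy as np

rng = np.random.default_rng(1)
N, beta = 10_000, 1.0
J = rng.choice([-1.0, 1.0], size=N)        # random +/-J bonds
h = rng.normal(0.0, 0.5, size=N)           # random local fields

s = np.array([1.0, -1.0])                  # the two spin states
v = np.ones(2)
log_Z = 0.0
for i in range(N):
    # T[a, b] = exp(beta * (J_i * s_a * s_b + h_i * s_a))
    T = np.exp(beta * (J[i] * np.outer(s, s) + h[i] * s[:, None]))
    v = T @ v
    norm = v.sum()
    log_Z += np.log(norm)                  # accumulate logs to avoid overflow
    v /= norm

print("free energy per spin:", -log_Z / (beta * N))
```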

    Thouless-Anderson-Palmer equation for analog neural network with temporally fluctuating white synaptic noise

    Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analog neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks in general defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), an alternative to the replica method for deriving a set of order parameter equations, requires no energy concept and thus remains available for studying networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless-Anderson-Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. Studies of the TAP equation are scarce for the case without an energy concept, which is of particular interest here, although the TAP equation is closely related to the SCSNA when an energy concept exists. This paper aims to derive the TAP equation for networks with synaptic noise, together with a set of order parameter equations, by a hybrid use of the cavity method and the SCSNA. Comment: 13 pages, 3 figures
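    For orientation, the flavour of a TAP equation can be shown in the classical equilibrium case (the SK model), where the naive mean field is corrected by an Onsager reaction term; the damped fixed-point iteration below is a standard textbook sketch, not the equation derived in the paper.

```python
# Standard textbook sketch (not the paper's TAP equation): damped iteration of
# the SK-model TAP equations m_i = tanh(beta*(sum_j J_ij m_j + h_i
# - beta*(1-q)*m_i)), with q = mean(m_i^2). Couplings, fields, beta: illustrative.
import numpy as np

rng = np.random.default_rng(2)
N, beta = 500, 0.8
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0                       # symmetric couplings ~ N(0, 1/N)
np.fill_diagonal(J, 0.0)
h = rng.normal(0.0, 0.3, size=N)          # random external fields

m = np.zeros(N)
for _ in range(2000):
    q = np.mean(m**2)
    # naive mean field plus the Onsager reaction term -beta*(1-q)*m_i
    m_new = np.tanh(beta * (J @ m + h - beta * (1.0 - q) * m))
    m = 0.5 * m + 0.5 * m_new             # damping for numerical stability
print("Edwards-Anderson order parameter q =", round(float(np.mean(m**2)), 3))
```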

    Hierarchical Self-Programming in Recurrent Neural Networks

    We study self-programming in recurrent neural networks where both neurons (the `processors') and synaptic interactions (`the programme') evolve in time simultaneously, according to specific coupled stochastic equations. The interactions are divided into a hierarchy of $L$ groups with adiabatically separated and monotonically increasing time-scales, representing sub-routines of the system programme of decreasing volatility. We solve this model in equilibrium, assuming ergodicity at every level, and find as our replica-symmetric solution a formalism with a structure similar, but not identical, to Parisi's $L$-step replica symmetry breaking scheme. Apart from differences in details of the equations (due to the fact that here interactions, rather than spins, are grouped into clusters with different time-scales), in the present model the block sizes $m_i$ of the emerging ultrametric solution are not restricted to the interval $[0,1]$, but are independent control parameters, defined in terms of the noise strengths of the various levels in the hierarchy, which can take any value in $[0,\infty)$. This is shown to lead to extremely rich phase diagrams, with an abundance of first-order transitions, especially when the level of stochasticity in the interaction dynamics is chosen to be low. Comment: 53 pages, 19 figures. Submitted to J. Phys.
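    A toy sketch of the general setting follows (a single coupling level, i.e. $L=1$, with Glauber spins and a Hebbian-like coupling drift; all dynamical rules and scalings here are illustrative assumptions, not the paper's coupled equations).

```python
# Toy sketch under stated assumptions: fast Glauber spins at temperature T,
# one slow group of couplings (L = 1) drifting towards the instantaneous spin
# correlations at its own noise level T_J. Rules and scalings are illustrative.
import numpy as np

rng = np.random.default_rng(3)
N, T, T_J, eps = 50, 0.8, 0.2, 1e-3       # eps: adiabatic timescale separation
s = rng.choice([-1.0, 1.0], size=N)
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

for _ in range(5000):
    # fast level: Glauber update of one randomly chosen neuron
    i = rng.integers(N)
    hloc = J[i] @ s
    s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * hloc / T)) else -1.0
    # slow level: Langevin-like drift of the 'programme' (the couplings)
    noise = rng.normal(size=(N, N))
    J += eps * (np.outer(s, s) / N - J) + np.sqrt(2.0 * eps * T_J / N) * noise
    J = (J + J.T) / 2.0
    np.fill_diagonal(J, 0.0)

print("magnetisation |m| =", round(abs(float(s.mean())), 3))
```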

    Statistical Mechanics of Soft Margin Classifiers

    We study the typical learning properties of the recently introduced Soft Margin Classifiers (SMCs), learning realizable and unrealizable tasks, with the tools of Statistical Mechanics. We derive analytically the behaviour of the learning curves in the regime of very large training sets. We obtain exponential and power laws for the decay of the generalization error towards the asymptotic value, depending on the task and on general characteristics of the distribution of stabilities of the patterns to be learned. The optimal learning curves of the SMCs, which give the minimal generalization error, are obtained by tuning the coefficient controlling the trade-off between the error and the regularization terms in the cost function. If the task is realizable by the SMC, the optimal performance is better than that of a hard margin Support Vector Machine and is very close to that of a Bayesian classifier. Comment: 26 pages, 12 figures, submitted to Physical Review
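    The trade-off referred to here can be made explicit with the usual soft-margin cost function $E(w) = \frac{1}{2}||w||^2 + C \sum_i \max(0, 1 - y_i\, w \cdot x_i)$, minimised below by plain subgradient descent on a realizable toy task; the data model and all sizes are illustrative assumptions.

```python
# Minimal sketch (illustrative data model and sizes): the soft-margin cost
# E(w) = ||w||^2 / 2 + C * sum_i max(0, 1 - y_i * w.x_i), minimised by
# subgradient descent; C is the error/regularization trade-off coefficient.
import numpy as np

rng = np.random.default_rng(4)
N, P, C, lr = 50, 200, 1.0, 1e-3
teacher = rng.normal(size=N)
teacher /= np.linalg.norm(teacher)
X = rng.normal(size=(P, N))
y = np.sign(X @ teacher)                  # realizable task: labels from a teacher

w = np.zeros(N)
for _ in range(5000):
    margins = y * (X @ w)
    viol = margins < 1.0                  # patterns violating the margin
    grad = w - C * (y[viol, None] * X[viol]).sum(axis=0)
    w -= lr * grad

# generalization error of a spherical perceptron: eps_g = arccos(R) / pi
R = w @ teacher / np.linalg.norm(w)
print(f"generalization error ~ {np.arccos(R) / np.pi:.3f}")
```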

    Slowly evolving random graphs II: Adaptive geometry in finite-connectivity Hopfield models

    We present an analytically solvable random graph model in which the connections between the nodes can evolve in time, adiabatically slowly compared to the dynamics of the nodes. We apply the formalism to finite connectivity attractor neural network (Hopfield) models and show that, due to the minimisation of frustration effects, the retrieval region of the phase diagram can be significantly enlarged. Moreover, the fraction of misaligned spins is reduced by this effect, and is smaller than in the infinite connectivity regime. The main cause of this difference is found to be the non-zero fraction of sites with vanishing local field when the connectivity is finite. Comment: 17 pages, 8 figures
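    As a static reference point (the paper additionally lets the graph itself re-wire adiabatically), here is a sketch of zero-temperature retrieval in a Hopfield model on a fixed sparse random graph, measuring the fraction of misaligned spins; sizes and parameters are illustrative.

```python
# Static reference sketch (no graph evolution): zero-temperature retrieval of
# a Hopfield model on a sparse random graph with mean connectivity c. All
# sizes, c and the corruption level are illustrative.
import numpy as np

rng = np.random.default_rng(5)
N, c, p = 2000, 5, 2                       # spins, mean connectivity, patterns
xi = rng.choice([-1.0, 1.0], size=(p, N))  # stored random patterns

A = (rng.random((N, N)) < c / N).astype(float)
A = np.triu(A, 1)
A = A + A.T                                # symmetric adjacency, no self-loops
J = A * (xi.T @ xi) / c                    # Hebbian couplings on the graph

s = xi[0].copy()
s[rng.random(N) < 0.1] *= -1.0             # start near pattern 0 (10% corrupted)
for _ in range(20):                        # sequential zero-temperature sweeps
    for i in rng.permutation(N):
        hloc = J[i] @ s
        if hloc != 0.0:                    # finite connectivity: fields can vanish
            s[i] = np.sign(hloc)

m = float(s @ xi[0]) / N
print(f"overlap with pattern 0: {m:.3f}; misaligned fraction: {(1 - m) / 2:.3f}")
```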